A New Perspective of Proximal Gradient Algorithms

Authors

  • Yi Zhou
  • Yingbin Liang
  • Lixin Shen
Abstract

We provide a new perspective for understanding proximal gradient algorithms. We show that both the proximal gradient algorithm (PGA) and the Bregman proximal gradient algorithm (BPGA) can be viewed as instances of the generalized proximal point algorithm (GPPA), from which more accurate convergence rates for PGA and BPGA are obtained directly. Furthermore, based on the GPPA framework, we incorporate a backtracking line search scheme into PGA and BPGA and analyze the resulting convergence rates, with numerical verification.
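The abstract does not spell out the update rules, so the following is only a minimal sketch of one proximal gradient iteration with a standard backtracking line search, in the spirit of the scheme mentioned above; the quadratic upper-bound test and the LASSO example are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1, used here as an example of prox_g.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def pga_backtracking(f, grad_f, prox_g, x0, L0=1.0, eta=2.0, max_iter=500, tol=1e-8):
    # Minimizes F(x) = f(x) + g(x): f smooth, g with an easy proximal operator.
    # The local Lipschitz estimate L is increased until the standard
    # sufficient-decrease (quadratic upper bound) condition holds.
    x, L = x0.copy(), L0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            x_new = prox_g(x - g / L, 1.0 / L)
            d = x_new - x
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta  # backtrack: shrink the step size 1/L
        if np.linalg.norm(d) <= tol:
            return x_new
        x = x_new
    return x

# Hypothetical usage: LASSO, min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
x_hat = pga_backtracking(
    f=lambda x: 0.5 * np.sum((A @ x - b) ** 2),
    grad_f=lambda x: A.T @ (A @ x - b),
    prox_g=lambda v, t: soft_threshold(v, lam * t),
    x0=np.zeros(50),
)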

Similar articles

Stochastic Proximal Gradient Algorithms for Multi-Source Quantitative Photoacoustic Tomography

The development of accurate and efficient image reconstruction algorithms is a central aspect of quantitative photoacoustic tomography (QPAT). In this paper, we address this issue for multi-source QPAT using the radiative transfer equation (RTE) as an accurate model for light transport. The tissue parameters are jointly reconstructed from the acoustical data measured for each of the applied sourc...

Iteration Bounds for Finding the ε-Stationary Points for Structured Nonconvex Optimization

In this paper we study proximal conditional-gradient (CG) and proximal gradient-projection type algorithms for a block-structured constrained nonconvex optimization model, which arises naturally from tensor data analysis. First, we introduce a new notion of ε-stationarity, which is suitable for the structured problem under consideration. We then propose two types of first-order algorithms for th...

Adaptive FISTA

In this paper we propose an adaptively extrapolated proximal gradient method, which is based on the accelerated proximal gradient method (also known as FISTA); however, we locally optimize the extrapolation parameter by carrying out an exact (or inexact) line search. It turns out that in some situations, the proposed algorithm is equivalent to a class of SR1 (identity minus rank 1) proximal quas...
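For reference only (not taken from the snippet above): classical FISTA applies a proximal gradient step at an extrapolated point y = x_k + beta_k (x_k - x_{k-1}), with beta_k fixed by the t_k recursion shown below; the adaptive variant described above would instead choose the extrapolation parameter by an exact or inexact line search. The sketch assumes a known Lipschitz constant L.

import numpy as np

def fista(grad_f, prox_g, x0, L, max_iter=200):
    # Classical FISTA with the standard t_k extrapolation rule.
    # The adaptive variant sketched above would replace beta by a value
    # found via a line search instead of this fixed recursion.
    x_prev, x, t = x0.copy(), x0.copy(), 1.0
    for _ in range(max_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        beta = (t - 1.0) / t_next        # extrapolation parameter
        y = x + beta * (x - x_prev)      # extrapolated point
        x_prev, x = x, prox_g(y - grad_f(y) / L, 1.0 / L)
        t = t_next
    return x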

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independently of the line search method, based on eigenvalue analysis. The globa...

A New Strategy for Training RBF Network with Applications to Nonlinear Integral Equations

A new learning strategy is proposed for training radial basis function (RBF) networks. We apply two different local optimization methods to update the output weights in the training process: the gradient method and a combination of the gradient and Newton methods. Numerical results obtained in solving nonlinear integral equations show the excellent performance of the combined gradient method in ...

Journal:

Volume:   Issue:

Pages:  -

Publication date: 2015